Experiences with Some Benchmarks for Deductive Databases and Implementations of Bottom-Up Evaluation
OpenRuleBench is a large benchmark suite for rule engines, which includes
deductive databases. We previously proposed a translation of Datalog to C++
based on a method that "pushes" derived tuples immediately to places where they
are used. In this paper, we report performance results of various
implementation variants of this method compared to XSB, YAP and DLV. We study
only a fraction of the OpenRuleBench problems, but we give quite a detailed
analysis of each such task and the factors that influence performance. The
results not only show the potential of our method and implementation approach,
but could also be valuable for anyone implementing systems that must execute
tasks of the types discussed.
Comment: In Proceedings WLP'15/'16/WFLP'16, arXiv:1701.0014
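The "push" idea described above can be illustrated with a minimal sketch. This is not the authors' Datalog-to-C++ translation; it is a hypothetical Python rendering, assuming the classic transitive-closure program, in which each newly derived tuple is immediately propagated to the rule bodies that consume it rather than collected in delta relations:

```python
# Minimal sketch (not the authors' C++ implementation) of "push"-style
# bottom-up evaluation for the transitive-closure program:
#   tc(X,Y) :- edge(X,Y).
#   tc(X,Z) :- edge(X,Y), tc(Y,Z).
# Each newly derived tc-tuple is pushed at once to the rules using it.

def transitive_closure(edges):
    tc = set()
    # Index edge facts by their target node: to extend edge(X,Y) with a
    # new tc(Y,Z), we need all X with edge(X, Y).
    preds = {}                      # preds[y] = {x : edge(x, y)}
    for x, y in edges:
        preds.setdefault(y, set()).add(x)

    def push(x, z):
        """Derive tc(x, z); if it is new, propagate it immediately."""
        if (x, z) in tc:
            return                  # duplicate, nothing to do
        tc.add((x, z))
        # Rule tc(X,Z) :- edge(X,Y), tc(Y,Z), instantiated with Y = x:
        for w in preds.get(x, ()):
            push(w, z)              # recursion for clarity, not speed

    for x, y in edges:              # rule tc(X,Y) :- edge(X,Y)
        push(x, y)
    return tc

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

The duplicate check in `push` is what terminates the recursion; a compiled implementation would replace the dictionary index and Python recursion with specialized data structures and loops.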
Declarative Output by Ordering Text Pieces
Most real-world programs must produce output. If a deductive database is used to implement database application programs, it should be possible to specify the output declaratively. There is no generally accepted, completely satisfying solution for this. In this paper we propose to specify an output document by defining the positions of text pieces (the building blocks of the document). These text pieces are then ordered by their position and concatenated. This way of specifying output fits well with the bottom-up way of thinking about rules (from right to left) that is common in deductive databases. Of course, when evaluating such programs, one wants to avoid sorting operations as far as possible. We show how rules involving ordering can be efficiently implemented.
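The position-and-concatenate idea can be sketched as follows. This is a hypothetical illustration, not the paper's rule language: rules derive `(position, text)` facts, and the document is obtained by sorting on position and joining the texts. Tuple-valued positions are an assumption here, used so that nested parts sort locally:

```python
# Sketch of the idea (hypothetical API, not the paper's system): output
# is specified as facts (Position, Text); the final document is produced
# by sorting the derived facts on Position and concatenating the texts.

def render(facts):
    """facts: iterable of (position, text) pairs derived by rules."""
    return "".join(text for _, text in sorted(facts))

# "Rules" (here ordinary Python) derive positioned text pieces, e.g. an
# HTML list; tuple positions let the items sort between the two tags.
items = ["alpha", "beta"]
facts = [((0,), "<ul>\n"), ((2,), "</ul>\n")]
for i, item in enumerate(items):
    facts.append(((1, i), f"  <li>{item}</li>\n"))

print(render(facts))
```

As the abstract notes, a real implementation would try to generate the pieces already in order so that the explicit sort largely disappears.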
Super Logic Programs
The Autoepistemic Logic of Knowledge and Belief (AELB) is a powerful
nonmonotonic formalism introduced by Teodor Przymusinski in 1994. In this
paper, we specialize it to a class of theories called `super logic programs'.
We argue that these programs form a natural generalization of standard logic
programs. In particular, they allow disjunctions and default negation of
arbitrary positive objective formulas.
Our main results are two new and powerful characterizations of the static
semantics of these programs, one syntactic, and one model-theoretic. The
syntactic fixed point characterization is much simpler than the fixed point
construction of the static semantics for arbitrary AELB theories. The
model-theoretic characterization via Kripke models allows one to construct
finite representations of the inherently infinite static expansions.
Both characterizations can be used as the basis of algorithms for query
answering under the static semantics. We describe a query-answering interpreter
for super programs which we developed based on the model-theoretic
characterization and which is available on the web.
Comment: 47 pages, revised version of the paper submitted 10/200
Transformation-Based Bottom-Up Computation of the Well-Founded Model
We present a framework for expressing bottom-up algorithms to compute the
well-founded model of non-disjunctive logic programs. Our method is based on
the notion of conditional facts and elementary program transformations studied
by Brass and Dix for disjunctive programs. However, even if we restrict their
framework to nondisjunctive programs, their residual program can grow to
exponential size, whereas for function-free programs our program remainder is
always polynomial in the size of the extensional database (EDB).
We show that particular orderings of our transformations (we call them
strategies) correspond to well-known computational methods like the alternating
fixpoint approach, the well-founded magic sets method and the magic alternating
fixpoint procedure. However, due to the confluence of our calculi, we come up
with computations of the well-founded model that are provably better than these
methods.
In contrast to other approaches, our transformation method treats magic set
transformed programs correctly, i.e. it always computes a relevant part of the
well-founded model of the original program.
Comment: 43 pages, 3 figures
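One of the strategies the abstract relates to, the alternating fixpoint approach, can be sketched compactly. This is a hypothetical illustration of that classical method on ground normal programs, not the paper's transformation-based remainder computation: the antimonotone operator `gamma` computes the least model of the Gelfond-Lifschitz reduct, and alternating it yields an increasing underestimate (true atoms) and a decreasing overestimate (possibly-true atoms):

```python
# Minimal sketch of the alternating-fixpoint computation of the
# well-founded model of a ground normal program. A program is a list
# of rules (head, positive_body, negative_body) over atom names.

def gamma(program, interp):
    """Least model of the Gelfond-Lifschitz reduct w.r.t. interp."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos, neg in program:
            # keep the rule if no negative body atom is in interp
            if (head not in model and pos <= model
                    and not (neg & interp)):
                model.add(head)
                changed = True
    return model

def well_founded(program):
    under = set()                        # underestimate of true atoms
    over = gamma(program, under)         # first overestimate
    while True:
        new_under = gamma(program, over)
        if new_under == under:
            break                        # alternating sequence stable
        under = new_under
        over = gamma(program, under)
    return under, over                   # true atoms, possibly-true atoms

# p :- not q.   q :- not p.   r :- not r.   s.
prog = [("p", set(), {"q"}), ("q", set(), {"p"}),
        ("r", set(), {"r"}), ("s", set(), set())]
true, possible = well_founded(prog)
print(true)       # {'s'}: s is true
print(sorted(possible - true))   # ['p', 'q', 'r']: undefined atoms
```

Atoms outside `possible` are false, atoms in `possible - true` are undefined; the point of the abstract is that other orderings of the underlying transformations can provably beat this strategy.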
The Central Counterparty and the Impermissible Acquisition of a Company's Own Shares
The consequences of an impermissible acquisition of a company's own shares for a transaction between two trading partners are well known. The introduction of a central counterparty, however, has fundamentally changed the contract and settlement structure of exchange trading. A purchase that is economically a single transaction is legally split into two transactions with the CCP. The following contribution shows how an impermissible acquisition of own shares plays out in this system and which risks this entails for the central counterparty.
Identification of Naturally Processed Hepatitis C Virus-Derived Major Histocompatibility Complex Class I Ligands
Fine mapping of human cytotoxic T lymphocyte (CTL) responses against hepatitis C virus (HCV) is based on external loading of target cells with synthetic peptides which are either derived from prediction algorithms or from overlapping peptide libraries. These strategies do not address putative host and viral mechanisms which may alter processing as well as presentation of CTL epitopes. Therefore, the aim of this proof-of-concept study was to identify naturally processed HCV-derived major histocompatibility complex (MHC) class I ligands. To this end, continuous human cell lines were engineered to inducibly express HCV proteins and to constitutively express high levels of functional HLA-A2. These cell lines were recognized in an HLA-A2-restricted manner by HCV-specific CTLs. Ligands eluted from HLA-A2 molecules isolated from large-scale cultures of these cell lines were separated by high performance liquid chromatography and further analyzed by electrospray ionization quadrupole time-of-flight mass spectrometry (MS)/tandem MS. These analyses allowed the identification of two HLA-A2-restricted epitopes derived from HCV nonstructural proteins (NS) 3 and 5B (NS3 1406–1415 and NS5B 2594–2602). In conclusion, we describe a general strategy that may be useful to investigate HCV pathogenesis and may contribute to the development of preventive and therapeutic vaccines in the future.
Decomposition of Multiple Coverings into More Parts
We prove that for every centrally symmetric convex polygon Q, there exists a
constant alpha such that any alpha*k-fold covering of the plane by translates
of Q can be decomposed into k coverings. This improves on a quadratic upper
bound proved by Pach and Toth (SoCG'07). The question is motivated by a sensor
network problem, in which a region has to be monitored by sensors with limited
battery lifetime.